Incremental Subgradients for Constrained Convex Optimization: A Unified Framework and New Methods

Authors

  • Elias Salomão Helou Neto
  • Alvaro R. De Pierro
Abstract

We present a unifying framework for nonsmooth convex minimization that brings together ε-subgradient algorithms and methods for the convex feasibility problem. This development is a natural step for ε-subgradient methods in the direction of constrained optimization, since the Euclidean projection frequently required by such methods is replaced by an approximate projection, which is often easier to compute. The developments are applied to incremental subgradient methods, resulting in new algorithms suited to large-scale optimization problems, such as those arising in tomographic imaging.
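The approximate-projection idea described in the abstract can be illustrated with a small sketch. The problem instance, step sizes, and the single-halfspace relaxed projection below are illustrative assumptions, not the authors' exact algorithm:

```python
import numpy as np

# Minimize f(x) = sum_i |a_i·x - b_i| subject to x lying in an intersection
# of halfspaces {x : c_j·x <= d_j}.  Instead of an exact Euclidean projection
# onto the intersection, each step applies a cheap approximate projection
# that corrects only the most violated halfspace.

rng = np.random.default_rng(0)
m, n = 20, 5
A, b = rng.normal(size=(m, n)), rng.normal(size=m)
C, d = rng.normal(size=(3, n)), np.ones(3)

def approx_project(x):
    # Relaxed projection: fix only the most violated halfspace (much cheaper
    # than projecting exactly onto the intersection of all of them).
    viol = C @ x - d
    j = int(np.argmax(viol))
    if viol[j] > 0:
        x = x - viol[j] * C[j] / (C[j] @ C[j])
    return x

x = np.zeros(n)
for k in range(1, 1001):
    step = 0.1 / k                            # diminishing step size
    for i in range(m):                        # incremental pass over components
        g = np.sign(A[i] @ x - b[i]) * A[i]   # subgradient of |a_i·x - b_i|
        x = approx_project(x - step * g)

print(np.sum(np.abs(A @ x - b)), np.max(C @ x - d))
```

Replacing the exact projection by this relaxed step keeps the per-iteration cost low while still driving the iterates toward feasibility, which is the trade-off the abstract highlights.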


Related articles

Estimate sequence methods: extensions and approximations

The approach of estimate sequence offers an interesting rereading of a number of accelerating schemes proposed by Nesterov [Nes03], [Nes05], and [Nes06]. It seems to us that this framework is the most appropriate descriptive framework to develop an analysis of the sensitivity of the schemes to approximations. We develop in this work a simple, self-contained, and unified framework for the study ...


Incremental Constraint Projection-Proximal Methods for Nonsmooth Convex Optimization

We consider convex optimization problems with structures that are suitable for stochastic sampling. In particular, we focus on problems where the objective function is an expected value or is a sum of a large number of component functions, and the constraint set is the intersection of a large number of simpler sets. We propose an algorithmic framework for projection-proximal methods using rando...


Min Common/Max Crossing Duality: A Geometric View of Conjugacy in Convex Optimization

We provide a unifying framework for the visualization and analysis of duality, and other issues in convex optimization. It is based on two simple optimization problems that are dual to each other: the min common point problem and the max crossing point problem. Within the insightful geometry of these problems, several of the core issues in convex analysis become apparent and can be analyzed in ...


Incremental Gradient, Subgradient, and Proximal Methods for Convex Optimization: A Survey

We survey incremental methods for minimizing a sum f(x) = ∑_{i=1}^{m} f_i(x) consisting of a large number of convex component functions f_i. Our methods consist of iterations applied to single components, and have proved very effective in practice. We introduce a unified algorithmic framework for a variety of such methods, some involving gradient and subgradient iterations, which are known, and some involvin...
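A minimal instance of such an incremental iteration cycles through the components f_i one at a time, stepping along a single component's (sub)gradient. The quadratic components and data below are illustrative assumptions:

```python
import numpy as np

# Minimize f(x) = sum_i f_i(x) with f_i(x) = (a_i·x - b_i)^2 / 2 by
# stepping along the gradient of one component per inner iteration.

rng = np.random.default_rng(1)
m, n = 50, 3
A = rng.normal(size=(m, n))
x_true = np.array([1.0, -2.0, 0.5])
b = A @ x_true + 0.01 * rng.normal(size=m)

x = np.zeros(n)
for k in range(1, 301):
    step = 0.05 / k                      # diminishing step size
    for i in range(m):                   # one component per inner step
        g = (A[i] @ x - b[i]) * A[i]     # gradient of the i-th component
        x -= step * g
print(x)
```

Each inner step touches only one row of the data, which is why these methods scale to sums with very many components.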


The effect of deterministic noise in subgradient methods

In this paper, we study the influence of noise on subgradient methods for convex constrained optimization. The noise may be due to various sources, and is manifested in inexact computation of the subgradients and function values. Assuming that the noise is deterministic and bounded, we discuss the convergence properties for two cases: the case where the constraint set is compact, and the case w...
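The bounded-deterministic-noise setting can be sketched as follows; the objective, the compact box constraint, and the particular noise sequence are illustrative assumptions, not the paper's analysis:

```python
import numpy as np

# Projected subgradient method for f(x) = ||x||_1 over the compact box
# [-1, 1]^n, where each subgradient is corrupted by a bounded deterministic
# error.  With diminishing steps the iterates still approach the minimizer
# (here 0) up to an error governed by the noise bound.

n = 4
x = np.ones(n)
eps = 0.05                                       # bound on the noise magnitude
for k in range(1, 5001):
    g = np.sign(x)                               # a subgradient of ||x||_1
    noise = eps * np.cos(float(k)) * np.ones(n)  # deterministic, |noise| <= eps
    x = np.clip(x - (1.0 / k) * (g + noise), -1.0, 1.0)  # project onto the box
print(np.abs(x).max())
```

Compactness of the constraint set (the box here) is one of the two cases the abstract distinguishes; it keeps the iterates, and hence the effect of the noise, bounded.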



Journal title:
  • SIAM Journal on Optimization

Volume 20, Issue

Pages -

Publication date: 2009